Transparency and Surveillance as Sociotechnical Accountability by Deborah G. Johnson and Priscilla M. Regan

Author: Deborah G. Johnson, Priscilla M. Regan
Language: eng
Format: epub
ISBN: 9781138790735
Publisher: Taylor & Francis
Published: 2014-07-14T00:00:00+00:00


Normative Landscape

The house-of-mirrors analysis of Google’s search engine operations shows that simple and seemingly innocuous data (i.e., a term or terms submitted for a search) are combined with other data about the user’s past searches and with data about other users to produce a rich and revealing account of the user, an account potentially indicative of the user’s private thoughts, concerns, and interests—her inner life. As this transformation occurs, there is a gap between users’ understanding of what they are doing and what Google is doing. This gap is what defines the normative landscape of Google search. Users submit search terms, seemingly alone in the privacy of their own space, and, although there are no obvious signs of it, they engage with multiple actors (inside and, potentially, outside Google) that produce accounts of them for those actors’ own particular purposes. Although Google would never characterize itself as a system of surveillance, it collects enormous amounts of information about its users, uses the information to develop accounts of them, and then delivers results (search results and advertisements) based on the accounts. This is a form of surveillance and accountability, and yet users experience Google as something similar to the Yellow Pages or even a seamless repository of knowledge. The norms that govern users’ actions (knowledge seeking, inquiry, curiosity) therefore contrast sharply with the norms governing Google’s extensive mining of users’ data.

Returning to the vignette at the beginning of this chapter, the string of search terms alone suggests the revelatory power of the data that users provide to Google. In Google’s house of mirrors those simple terms—revealing in themselves—are, when combined with other data, transformed into detailed search histories of individual users. Of course, Google could argue that its interest in users is only an interest in delivering relevant search results. Google could also argue that its use of user data is targeted to identify patterns of behavior in groups of users. Moreover, Google could argue that no person looks at the search histories of particular users. Google’s operations are automated; search histories are generated and used computationally, so there is no person watching another person. Each of these defenses may be true, but none of them refutes the claim that Google is a system of surveillance and accountability. It collects data on users, keeps track of their activities, and then makes decisions on the basis of accounts it develops. Users are, in this sense, held accountable for their behavior.

The experience and frame of mind of users stand in stark contrast to Google’s operations. The user experiences submitting discrete search terms and interacting with a machine and software, yet in the house of mirrors Google accumulates search history data, interprets patterns of behavior, and uses algorithms to determine what to return in the way of both search results and advertising. Users engage with Google’s search engine as if they were engaging with a directory of websites matched to their interests (and mediated by the search terms that the user submits).


